Margin-based Decomposed Amortized Inference
Authors
Abstract
Given that structured output prediction is typically performed over entire datasets, a natural question is whether computation from earlier inference instances can be re-used to speed up inference for future instances. Amortized inference has been proposed as a way to accomplish this. In this paper, we first introduce a new amortized inference algorithm, called margin-based amortized inference, which uses the notion of structured margin to identify inference problems for which previous solutions are provably optimal. Second, we introduce decomposed amortized inference, which is designed to address very large inference problems, where earlier amortization methods become less effective. This approach decomposes the output structure and applies amortization piece-wise, increasing the chance that previous solutions can be re-used for parts of the output structure. These parts are then combined into a globally coherent solution using Lagrangian relaxation. In experiments on the NLP tasks of semantic role labeling and entity-relation extraction, we show that with the margin-based algorithm, the inference engine needs to be called for only a third of the test examples. Further, the decomposed variant of margin-based amortized inference achieves an even greater reduction in the number of inference calls.
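As a rough illustration of the margin-based re-use idea described above, the sketch below assumes the structured prediction problems are posed as 0-1 integer linear programs over a shared feasible set, and that a hypothetical `solver` callable returns both the argmax assignment and its structured margin (the gap between the best and second-best objective values). The re-use test shown is one sufficient condition that follows from these assumptions; it is meant to illustrate the caching scheme, not to reproduce the paper's exact criterion.

```python
import numpy as np

class MarginAmortizedCache:
    """Cache of previously solved 0-1 ILP inference problems.

    Each entry stores the objective coefficients c_p, the optimal assignment
    y_p, and the structured margin delta_p (gap between the best and the
    second-best objective value). For a new objective c_q over the same
    feasible set, a sufficient condition decides whether y_p is provably
    still optimal; only if no cached entry passes the test is the full
    inference engine called.
    """

    def __init__(self, solver):
        # `solver` is a hypothetical callable: c -> (y_opt, margin).
        self.solver = solver
        self.entries = []  # list of (c_p, y_p, delta_p) tuples

    @staticmethod
    def _reusable(c_p, y_p, delta_p, c_q):
        # Illustrative sufficient condition (an assumption for this sketch):
        # y_p remains optimal under c_q if the change in y_p's score plus the
        # cached margin dominates the largest score gain any 0-1 structure
        # could obtain from the coefficient change.
        diff = c_q - c_p
        max_gain_any_structure = np.maximum(diff, 0.0).sum()
        return diff @ y_p + delta_p >= max_gain_any_structure

    def infer(self, c_q):
        c_q = np.asarray(c_q, dtype=float)
        for c_p, y_p, delta_p in self.entries:
            if self._reusable(c_p, y_p, delta_p, c_q):
                return y_p  # re-use a cached solution, no solver call
        y_q, delta_q = self.solver(c_q)  # fall back to exact inference
        self.entries.append((c_q, y_q, delta_q))
        return y_q
```

The decomposed variant would apply the same kind of cache to pieces of the output structure and then reconcile those pieces into a globally coherent assignment with Lagrangian relaxation; that reconciliation step is omitted from this sketch.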
Similar papers
Semi-Amortized Variational Autoencoders
Amortized variational inference (AVI) replaces instance-specific local inference with a global inference network. While AVI has enabled efficient training of deep generative models such as variational autoencoders (VAE), recent empirical work suggests that inference networks can produce suboptimal variational parameters. We propose a hybrid approach that uses AVI to initialize the variational par...
Amortized Resource Analysis with Polynomial Potential: A Static Inference of Polynomial Bounds for Functional Programs (Extended Version)
In 2003, Hofmann and Jost introduced a type system that uses a potential-based amortized analysis to infer bounds on the resource consumption of (first-order) functional programs. This analysis has been successfully applied to many standard algorithms but is limited to bounds that are linear in the size of the input. Here we extend this system to polynomial resource bounds. An automatic amortiz...
Deep Amortized Inference for Probabilistic Programs
Probabilistic programming languages (PPLs) are a powerful modeling tool, able to represent any computable probability distribution. Unfortunately, probabilistic program inference is often intractable, and existing PPLs mostly rely on expensive, approximate sampling-based methods. To alleviate this problem, one could try to learn from past inferences, so that future inferences run fa...
The Amortized Bootstrap
We use amortized inference in conjunction with implicit models to approximate the bootstrap distribution over model parameters. We call this the amortized bootstrap, as statistical strength is shared across dataset replicates through a metamodel. At test time, we can then perform amortized bagging by drawing multiple samples from the implicit model. We find amortized bagging outperforms bagging...